Lecture 16: Relative-error Low-rank Matrix Approximation with Sampling and Projections
Abstract
Today, we will start to discuss how to improve the rather coarse additive-error low-rank matrix approximation algorithms from the last two classes to obtain much better results for low-rank matrix approximation. Importantly, “better” means very different things to different research communities, and thus we will discuss several different notions of better. We will start by describing how to improve the additive-error bounds we have been discussing to relative-error low-rank matrix approximation. Here is the reading for today.
Similar resources
Lecture 15: Additive-error Low-rank Matrix Approximation with Sampling and Projections
• A spectral norm bound on the reconstruction error of the basic low-rank approximation random sampling algorithm.
• A discussion of how similar bounds can be obtained with a variety of random projection algorithms.
• A discussion of possible ways to improve the basic additive-error bounds.
• An iterative algorithm that leads to additive error with a much smaller additive scale. This will involve u...
Lecture 14: Additive-error Low-rank Matrix Approximation with Sampling and Projections
Today, we will shift gears and begin to discuss RandNLA algorithms for low-rank matrix approximation. We will start with additive-error low-rank matrix approximation with sampling and projections. These are of interest historically and since they illustrate several techniques (norm-squared sampling, simple linear algebraic manipulations, the use of matrix perturbation theory, etc.), but they are...
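As a concrete illustration of the norm-squared sampling mentioned in this snippet, here is a minimal NumPy sketch: columns are drawn with probability proportional to their squared Euclidean norms and rescaled so that the sampled matrix is an unbiased estimator in the sense used in these lectures. The function name and the specific sizes are our own illustrative choices, not anything from the lecture notes.

```python
import numpy as np

def norm_squared_sample(A, c, rng=None):
    """Sample c columns of A with probability proportional to their squared
    norms, rescaling each picked column so that C @ C.T is an unbiased
    estimator of A @ A.T. (Illustrative sketch; interface is ours.)"""
    rng = np.random.default_rng(rng)
    col_norms_sq = np.sum(A**2, axis=0)
    probs = col_norms_sq / col_norms_sq.sum()
    idx = rng.choice(A.shape[1], size=c, replace=True, p=probs)
    # Dividing column i by sqrt(c * p_i) makes C C^T unbiased for A A^T.
    C = A[:, idx] / np.sqrt(c * probs[idx])
    return C

# Usage: approximate a left singular subspace of A from the sampled C.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
C = norm_squared_sample(A, c=40, rng=1)
U, _, _ = np.linalg.svd(C, full_matrices=False)
```

In the additive-error algorithms discussed in these lectures, the top-k columns of U then serve as the approximate basis onto which A is projected.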
Adaptive Sampling and Fast Low-Rank Matrix Approximation
We prove that any real matrix A contains a subset of at most 4k/ε + 2k log(k + 1) rows whose span “contains” a matrix of rank at most k with error only (1 + ε) times the error of the best rank-k approximation of A. We complement it with an almost matching lower bound by constructing matrices where the span of any k/2ε rows does not “contain” a relative (1 + ε)-approximation of rank k. Our existence...
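The adaptive sampling idea behind this result can be sketched as follows: repeatedly sample rows with probability proportional to their squared residual norm after projecting A onto the span of the rows picked so far. This is a rough sketch only; the round and sample-size parameters below are illustrative and do not implement the 4k/ε + 2k log(k + 1) bound from the paper.

```python
import numpy as np

def adaptive_row_sample(A, rounds, rows_per_round, rng=None):
    """Adaptive row sampling sketch: in each round, draw rows with
    probability proportional to the squared norm of their current
    residual, then update the residual by projecting A onto the row
    span of everything picked so far."""
    rng = np.random.default_rng(rng)
    picked = []
    E = A.copy()  # residual: A minus its projection onto picked rows
    for _ in range(rounds):
        norms_sq = np.sum(E**2, axis=1)
        probs = norms_sq / norms_sq.sum()
        idx = rng.choice(A.shape[0], size=rows_per_round, replace=True, p=probs)
        picked.extend(idx.tolist())
        # Orthonormal basis Q for the row span of the picked rows.
        Q, _ = np.linalg.qr(A[picked, :].T)
        E = A - (A @ Q) @ Q.T  # project rows of A onto that span
    return picked, E

A = np.random.default_rng(0).standard_normal((80, 60))
picked, E = adaptive_row_sample(A, rounds=3, rows_per_round=5, rng=1)
```

Each round concentrates probability mass on the part of A not yet captured, which is what drives the error from additive down toward relative.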
Lecture 9: Fast Random Projections and FJLT, cont.
Warning: these notes are still very rough. They provide more details on what we discussed in class, but there may still be some errors, incomplete/imprecise statements, etc. in them. We continue with the discussion from last time. There is no new reading, just the same as last class. Today, we will do the following. • Show that the two structural conditions required for good LS approximation ar...
Lecture 20: Low-rank Approximation with Element-wise Sampling
So far, we have been talking about sampling/projection of rows/columns—i.e., we have been working with the actual columns/rows or linear combinations of the columns/rows of an input matrix A. Formally, this means that we are pre- or post-multiplying the input matrix A with a sampling/projection/sketching operator (that itself can be represented as a matrix) to construct another matrix A′ (with di...
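The pre-/post-multiplication view described in this excerpt can be made concrete with a small NumPy sketch. The dimensions and the choice of a Gaussian sketching matrix are illustrative assumptions on our part; any of the sampling/projection operators from these lectures could play the same role.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 30))

# Post-multiplying by a sketching operator S: here a Gaussian random
# projection of shape (30, 10), so the columns of A' = A S are random
# linear combinations of the columns of A.
S = rng.standard_normal((30, 10)) / np.sqrt(10)
A_prime = A @ S

# Pre-multiplying by a sampling matrix P instead keeps actual rows of A:
idx = rng.choice(100, size=20, replace=False)
P = np.zeros((20, 100))
P[np.arange(20), idx] = 1.0
A_rows = P @ A  # identical to A[idx, :]
```

Element-wise sampling, the subject of that lecture, differs precisely in that it cannot be written as such a pre- or post-multiplication: it zeroes out individual entries of A rather than acting on whole rows or columns.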
Publication date: 2015